Learning Knowledge Graph Embeddings for Natural Language Processing

Author

  • Muhao Chen
Abstract

Knowledge graph embeddings, which have been introduced recently, provide powerful latent semantic representations for the structured knowledge in knowledge graphs. Unlike the already widely used word embeddings learned from plain text, knowledge graph embeddings enable direct, explicit relational inference among entities through simple calculations on embedding vectors, and they are particularly effective at highlighting the key concepts underlying sophisticated human languages. Knowledge graph embeddings therefore provide potent tools for modern NLP applications, inasmuch as they preserve the multi-faceted knowledge and structure of the knowledge graphs. However, recent research efforts have not progressed much beyond representing simple or multi-mapping relations (e.g., one-to-many, many-to-many) in monolingual knowledge graphs. Many crucial problems remain largely unsolved, including how to preserve important relational properties and how to characterize both monolingual and cross-lingual knowledge in multiple language-specific versions of a knowledge base. Another pressing challenge is how to incorporate knowledge graph embeddings into NLP tasks that currently rely on word embeddings or other representation techniques. In this prospectus, we first propose new models for encoding such multi-faceted knowledge. We begin by investigating an approach that captures cross-lingual transitions across different language-specific versions of the embedding spaces, while preserving the monolingual relations within each embedding space. We then study an approach to retain the important relational properties that commonly exist in domain-specific and ontology-level knowledge graphs, including transitivity, symmetry, and hierarchies.
After that, we explore how our new embedding models may be used to improve modern NLP tasks, including relation extraction, knowledge alignment, semantic relatedness analysis, and sentiment analysis.
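The "simple calculation of embedding vectors" described above can be illustrated with a translation-based model in the style of TransE, where a relation is represented as a vector offset between entity embeddings (h + r ≈ t). The following is a minimal sketch, not the author's actual models; all entity names, relation names, and vector values are hypothetical toy data chosen for illustration:

```python
import numpy as np

# Toy 3-dimensional embeddings (hypothetical values for illustration only).
entity = {
    "Paris":   np.array([1.0, 0.0, 0.0]),
    "France":  np.array([1.0, 1.0, 0.0]),
    "Berlin":  np.array([0.0, 0.0, 1.0]),
    "Germany": np.array([0.0, 1.0, 1.0]),
}
# One shared offset vector for all (city, capital_of, country) facts.
relation = {"capital_of": np.array([0.0, 1.0, 0.0])}

def score(h, r, t):
    """TransE-style plausibility score: smaller means more plausible."""
    return np.linalg.norm(entity[h] + relation[r] - entity[t])

def predict_tail(h, r):
    """Answer an (h, r, ?) query by ranking all candidate tail entities."""
    return min(entity, key=lambda t: score(h, r, t))

print(predict_tail("Paris", "capital_of"))   # -> France
print(predict_tail("Berlin", "capital_of"))  # -> Germany
```

Because one relation vector is shared across all facts of the same type, the model generalizes a single learned offset to unseen entity pairs; this is the kind of explicit relational inference that plain word embeddings do not directly support.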


Similar resources

Learning Multi-faceted Knowledge Graph Embeddings for Natural Language Processing

The multi-faceted nature of knowledge graphs challenges existing embedding-based approaches to representing them. To address some of these issues, we have investigated novel approaches that (i) capture the multilingual transitions across different language-specific versions of knowledge, and (ii) encode the commonly existing monolingual knowledge with important relational properties and hierarch...


ConceptNet 5.5: An Open Multilingual Graph of General Knowledge

Machine learning about language can be improved by supplying it with specific knowledge and sources of external information. We present here a new version of the linked open data resource ConceptNet that is particularly well suited to be used with modern NLP techniques such as word embeddings. ConceptNet is a knowledge graph that connects words and phrases of natural language with labeled edges...


On Graph Mining with Deep Learning: Introducing Model R for Link Weight Prediction

Deep learning has been successful in various domains including image recognition, speech recognition and natural language processing. However, the research on its application in graph mining is still in an early stage. Here we present Model R, a neural network model created to provide a deep learning approach to the link weight prediction problem. This model uses a node embedding technique that...


Temporal Reasoning Over Event Knowledge Graphs

Many advances in the computer science field, such as semantic search, recommendation systems, question answering, and natural language processing, are achieved with the help of large-scale knowledge bases (e.g., YAGO, NELL, DBPedia). However, many of these knowledge bases are static representations of knowledge and do not model time as its own dimension, or do so only for a small portion of the gr...


Hybed: Hyperbolic Neural Graph Embedding

Neural embeddings have been used with great success in Natural Language Processing (NLP). They provide compact representations that encapsulate word similarity and attain state-of-the-art performance in a range of linguistic tasks. The success of neural embeddings has prompted significant amounts of research into applications in domains other than language. One such domain is graph-structured d...





Publication date: 2017